Near Linear Lower Bound for Dimension Reduction in ℓ1
Authors
Abstract
Given a set of n points in ℓ1, how many dimensions are needed to represent all pairwise distances within a specific distortion? This dimension-distortion tradeoff question is well understood for the ℓ2 norm, where O((log n)/ε²) dimensions suffice to achieve 1 + ε distortion. In sharp contrast, there is a significant gap between upper and lower bounds for dimension reduction in ℓ1. A recent result shows that distortion 1 + ε can be achieved with n/ε² dimensions. On the other hand, the only lower bounds known are that distortion δ requires n^{Ω(1/δ²)} dimensions and that distortion 1 + ε requires n^{1/2−O(ε log(1/ε))} dimensions. In this work, we show the first near linear lower bounds for dimension reduction in ℓ1. In particular, we show that 1 + ε distortion requires at least n^{1−O(1/log(1/ε))} dimensions. Our proofs are combinatorial, but inspired by linear programming. In fact, our techniques lead to a simple combinatorial argument that is equivalent to the LP-based proof of Brinkman and Charikar for lower bounds on dimension reduction in ℓ1.
Keywords: dimension reduction, metric embedding
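For intuition on the ℓ2 baseline that the abstract contrasts against, here is a minimal Python sketch (not from the paper; the random point set, the constant 4, and all parameter choices are illustrative assumptions). It applies a random Gaussian projection down to k = O((log n)/ε²) dimensions and measures the worst pairwise ℓ2 distortion, which is exactly the guarantee that the paper shows cannot be matched in ℓ1 with so few dimensions.

```python
# Illustrative sketch of Johnson-Lindenstrauss-style dimension reduction in l2.
# Constants and sizes are chosen for demonstration only, not tight bounds.
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 100, 400, 0.5
k = int(np.ceil(4 * np.log(n) / eps**2))    # ~74 target dimensions (illustrative constant)

X = rng.normal(size=(n, d))                 # n arbitrary points in R^d
P = rng.normal(size=(d, k)) / np.sqrt(k)    # scaled random Gaussian projection
Y = X @ P                                   # projected points in R^k

def pairwise_l2(Z):
    # All pairwise Euclidean distances as an n x n matrix.
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

iu = np.triu_indices(n, k=1)                # distinct pairs only
ratios = pairwise_l2(Y)[iu] / pairwise_l2(X)[iu]
print(f"k = {k}, worst l2 distortion = {abs(ratios - 1).max():.3f} (target eps = {eps})")
# In l2 the worst distortion is typically well below eps with k = O((log n)/eps^2).
# The paper's lower bound says nothing comparable is possible in l1:
# 1 + eps distortion can require n^(1 - O(1/log(1/eps))) dimensions.
```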
Similar Papers
Bearing Capacity of Strip Footings near Slopes Using Lower Bound Limit Analysis
Stability of foundations near slopes is one of the important and complicated problems in geotechnical engineering, which has been investigated by various methods such as limit equilibrium, limit analysis, slip-line, finite element and discrete element. The complexity of this problem results from the combination of two probable failures: foundation failure and overall slope failure. The curr...
Minimizing Makespan with Start Time Dependent Jobs in a Two Machine Flow Shop
The purpose of this paper is to consider the problem of scheduling a set of start time-dependent jobs in a two-machine flow shop, in which the actual processing times of jobs increase linearly according to their starting time. The objective of this problem is to minimize the makespan. The problem is known to be NP-hard; therefore, there is no polynomial-time algorithm...
On the Impossibility of Dimension Reduction in ℓ1
The Johnson-Lindenstrauss Lemma shows that any n points in Euclidean space (with distances measured by the ℓ2 norm) may be mapped down to O((log n)/ε²) dimensions such that no pairwise distance is distorted by more than a (1 + ε) factor. Determining whether such dimension reduction is possible in ℓ1 has been an intriguing open question. We show strong lower bounds for general dimension reduction i...
Dimension Reduction in the ℓ1 Norm
The Johnson-Lindenstrauss Lemma shows that any set of n points in Euclidean space can be mapped linearly down to O((log n)/ε²) dimensions such that all pairwise distances are distorted by at most 1 + ε. We study the following basic question: Does there exist an analogue of the Johnson-Lindenstrauss Lemma for the ℓ1 norm? Note that the Johnson-Lindenstrauss Lemma gives a linear embedding which is inde...
Sample Complexity of Testing the Manifold Hypothesis
The hypothesis that high dimensional data tends to lie in the vicinity of a low dimensional manifold is the basis of a collection of methodologies termed Manifold Learning. In this paper, we study statistical aspects of the question of fitting a manifold with a nearly optimal least squared error. Given upper bounds on the dimension, volume, and curvature, we show that Empirical Risk Minimizatio...